
Fix memory not being freed after point cloud decimation#342

Merged
JanuszBedkowski merged 1 commit into MapsHD:main from bloom256:fix/memory_usage_in_point_cloud_decimate
Feb 7, 2026

Conversation

@bloom256
Collaborator

@bloom256 bloom256 commented Feb 6, 2026

Problem

When decimating point clouds, the decimate() function creates new, smaller vectors and assigns them to the member vectors. However, std::vector assignment does not reduce capacity: the original memory allocation is kept even when the new data is much smaller.

This meant that even with aggressive decimation, the application retained the full memory footprint of the original undecimated point clouds.
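The capacity-retention behavior described above can be demonstrated in isolation. This is a minimal sketch (the `assign_then_shrink` helper and the element counts are illustrative, not from the PR); on mainstream standard-library implementations, copy-assigning a small vector into a large one reuses the existing buffer, so capacity stays at the original size until shrink_to_fit() is requested:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative helper: returns {capacity after plain assignment,
// capacity after shrink_to_fit()}.
std::pair<std::size_t, std::size_t> assign_then_shrink() {
    std::vector<int> points(1'000'000, 42);  // stand-in for the full cloud
    std::vector<int> decimated(1'000, 42);   // stand-in for the decimated result

    points = decimated;  // size drops to 1'000, but on common implementations
                         // the 1'000'000-element buffer is reused and kept
    std::size_t before = points.capacity();

    points.shrink_to_fit();  // non-binding request to release the excess
    std::size_t after = points.capacity();
    return {before, after};
}
```

Note that shrink_to_fit() is a non-binding request in the standard, but the major implementations (including MSVC's, used in the tests below) do release the excess storage.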

Solution

Added shrink_to_fit() calls after each vector assignment in PointCloud::decimate() to release unused memory back to the system.
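The shape of the fix can be sketched as follows. This is a hypothetical, minimal stand-in for the project's PointCloud (one member vector and a simple every-n-th-point decimation), not the actual code in core/src/point_cloud.cpp, which applies the same call to six member vectors:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

struct Point { float x, y, z; };

// Hypothetical minimal PointCloud; the real class has six member vectors,
// each followed by shrink_to_fit() in decimate().
struct PointCloud {
    std::vector<Point> points;

    // Sketch of decimation: keep every step-th point.
    void decimate(std::size_t step) {
        std::vector<Point> kept;
        for (std::size_t i = 0; i < points.size(); i += step)
            kept.push_back(points[i]);

        points = std::move(kept);
        points.shrink_to_fit();  // the fix: release capacity beyond size()
    }
};
```

Even with move assignment, shrink_to_fit() is useful here: kept grows geometrically via push_back, so its capacity can substantially exceed its final size.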

Results

Tested with multi_view_tls_registration_step_2.exe on a 5 km dataset using 0.2 m decimation. Before the fix, the application used 80 GB of RAM; after the fix, it uses only 10 GB, an 88% reduction in memory usage. (Tested on Windows with the MSVC compiler.)

Files changed

  • core/src/point_cloud.cpp - Added shrink_to_fit() to 6 vectors in decimate()

@JanuszBedkowski
Member

Wow, you are amazing!

@JanuszBedkowski JanuszBedkowski merged commit e5872a2 into MapsHD:main Feb 7, 2026
6 checks passed
